Coordinate and Subspace Optimization Methods for Linear Least Squares with Non-Quadratic Regularization

Authors

  • Michael Elad
  • Boaz Matalon
  • Michael Zibulevsky
Abstract

This work addresses the problem of regularized linear least squares (RLS) with non-quadratic separable regularization. Despite being frequently deployed in many applications, the RLS problem is often hard to solve using standard iterative methods. In a recent work [10], a new iterative method called Parallel Coordinate Descent (PCD) was devised. We provide herein a convergence analysis of the PCD algorithm, and also introduce a form of the regularization function, which permits analytical solution to the coordinate optimization. Several other recent works [6, 12, 13], which considered the deblurring problem in a Bayesian methodology, also obtained element-wise optimization algorithms. We show that these three methods are essentially equivalent, and the unified method is termed Separable Surrogate Functionals (SSF). We also provide a convergence analysis for SSF. To further accelerate PCD and SSF, we merge them into a recently developed sequential subspace optimization technique (SESOP), with almost no additional complexity. A thorough numerical comparison of the denoising application is presented, using the Basis Pursuit Denoising (BPDN) objective function, which leads all of the above algorithms to an iterated shrinkage format. Both with synthetic data and with real images, the advantage of the combined PCD-SESOP method is clearly demonstrated.
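To make the setting concrete, the sketch below spells out the BPDN objective and one generic iterated-shrinkage step. It is only an ISTA-style illustration of the "iterated shrinkage format" mentioned in the abstract, not the paper's exact PCD, SSF, or SESOP updates, and the step scaling c = ||A||² is an assumption of this sketch.

```python
import numpy as np

def soft_threshold(v, t):
    """Element-wise soft thresholding: the shrinkage operator induced by the l1 term."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def bpdn_objective(A, b, x, lam):
    """BPDN objective: quadratic data term plus a non-quadratic, separable l1 regularizer."""
    return 0.5 * np.sum((A @ x - b) ** 2) + lam * np.sum(np.abs(x))

def iterated_shrinkage(A, b, lam, n_iter=200):
    """Generic iterated-shrinkage (ISTA-style) iteration for the BPDN objective."""
    c = np.linalg.norm(A, 2) ** 2      # assumed step scaling: spectral-norm bound on A^T A
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        x = soft_threshold(x + A.T @ (b - A @ x) / c, lam / c)
    return x
```

Roughly speaking, PCD replaces the uniform scaling 1/c with per-coordinate steps derived from the diagonal of AᵀA, and SESOP accelerates further by optimizing over a small subspace spanned by recent directions; both refinements, described in the paper, are omitted from this sketch.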


Similar articles

Least squares problems with inequality constraints as quadratic constraints

Linear least squares problems with box constraints are commonly solved to find model parameters within bounds based on physical considerations. Common algorithms include Bounded Variable Least Squares (BVLS) and the Matlab function lsqlin. Here, we formulate the box constraints as quadratic constraints, and solve the corresponding unconstrained regularized least squares problem. Box constraints...
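As a rough illustration of the idea in this entry (only an illustration; the exact formulation and the choice of penalty weight in the cited work may differ), each box constraint l_i ≤ x_i ≤ u_i can be rewritten as the quadratic constraint ((x_i − c_i)/r_i)² ≤ 1 with center c_i and half-width r_i, and then enforced through a Tikhonov-style penalty, giving an unconstrained regularized least-squares problem with a closed-form solution:

```python
import numpy as np

def box_penalized_least_squares(A, b, lower, upper, lam=1.0):
    """Solve min ||A x - b||^2 + lam * ||D (x - c)||^2, where the quadratic
    penalty encodes the box [lower, upper] via its center c and half-widths r."""
    c = (lower + upper) / 2.0          # box centers
    r = (upper - lower) / 2.0          # box half-widths (assumed strictly positive)
    D = np.diag(1.0 / r)               # rescales each coordinate of x - c to [-1, 1]
    lhs = A.T @ A + lam * D.T @ D      # normal equations of the penalized problem
    rhs = A.T @ b + lam * (D.T @ D) @ c
    return np.linalg.solve(lhs, rhs)
```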


On the Solution of the Tikhonov Regularization of the Total Least Squares Problem

Total least squares (TLS) is a method for treating an overdetermined system of linear equations Ax ≈ b, where both the matrix A and the vector b are contaminated by noise. Tikhonov regularization of the TLS (TRTLS) leads to an optimization problem of minimizing the sum of fractional quadratic and quadratic functions. As such, the problem is nonconvex. We show how to reduce the problem to a sing...
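Assuming the standard TRTLS formulation min_x ||Ax − b||²/(1 + ||x||²) + ρ||Lx||² (the "sum of fractional quadratic and quadratic functions" mentioned above), the snippet below only evaluates that objective and hands it to a generic local solver for illustration; the cited paper's contribution, the reduction to a single-variable problem, is not reproduced here.

```python
import numpy as np
from scipy.optimize import minimize

def trtls_objective(x, A, b, rho, L):
    """Fractional quadratic data term plus quadratic Tikhonov term."""
    return np.sum((A @ x - b) ** 2) / (1.0 + np.sum(x ** 2)) + rho * np.sum((L @ x) ** 2)

# Illustrative usage with random data (not data from the cited paper):
rng = np.random.default_rng(0)
A = rng.standard_normal((20, 5))
b = rng.standard_normal(20)
res = minimize(trtls_objective, x0=np.zeros(5), args=(A, b, 0.1, np.eye(5)))
```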


Linear Convergence of Proximal-Gradient Methods under the Polyak-Łojasiewicz Condition

In 1963, Polyak proposed a simple condition that is sufficient to show that gradient descent has a global linear convergence rate. This condition is a special case of the Łojasiewicz inequality proposed in the same year, and it does not require strong-convexity (or even convexity). In this work, we show that this much-older Polyak-Łojasiewicz (PL) inequality is actually weaker than the four mai...
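A toy numerical illustration of this condition (not code from the cited work): f(x) = ½||Ax − b||² with a rank-deficient A is convex and satisfies the PL inequality ½||∇f(x)||² ≥ μ(f(x) − f*), yet it is not strongly convex, and gradient descent still reduces f − f* geometrically.

```python
import numpy as np

# Rank-deficient least squares: convex, satisfies PL, but NOT strongly convex
# (A has a non-trivial null space, so the Hessian A^T A is singular).
A = np.array([[1.0, 0.0, 1.0],
              [0.0, 2.0, 2.0]])
b = np.array([1.0, 2.0])            # lies in the range of A, so f* = 0 here

def f(x):
    return 0.5 * np.sum((A @ x - b) ** 2)

L = np.linalg.norm(A, 2) ** 2       # Lipschitz constant of the gradient
x = np.array([3.0, -5.0, 1.0])
for k in range(20):
    x = x - A.T @ (A @ x - b) / L   # plain gradient descent with step 1/L
    print(k, f(x))                  # decreases geometrically, as PL predicts
```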


Semi-smooth Second-order Type Methods for Composite Convex Programs

The goal of this paper is to study approaches to bridge the gap between first-order and second-order type methods for composite convex programs. Our key observations are: i) Many well-known operator splitting methods, such as forward-backward splitting (FBS) and Douglas-Rachford splitting (DRS), actually define a possibly semi-smooth and monotone fixed-point mapping; ii) The optimal solutions o...
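A minimal sketch of the fixed-point view mentioned here (the semi-smooth Newton machinery of the cited paper is not reproduced): for min f(x) + g(x) with smooth f(x) = ½||Ax − b||² and g = λ||·||₁, forward-backward splitting defines the mapping T(x) = prox_{tg}(x − t∇f(x)), and x* is optimal exactly when the residual F(x) = x − T(x) vanishes.

```python
import numpy as np

def prox_l1(v, t):
    """Proximal operator of t * ||.||_1 (soft thresholding)."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def fbs_residual(x, A, b, lam, t):
    """F(x) = x - prox_{t*g}(x - t * grad f(x)); zero exactly at minimizers."""
    grad_f = A.T @ (A @ x - b)
    return x - prox_l1(x - t * grad_f, t * lam)

# The plain FBS iteration is x <- x - fbs_residual(x, A, b, lam, t); a
# semi-smooth Newton-type method would instead solve F(x) = 0 using a
# generalized Jacobian of this piecewise-smooth mapping.
```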




Journal title:

Volume   Issue

Pages  -

Publication date: 2006